6.0 - 8.0 years
6 - 8 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
- 6+ years of total IT experience in development projects.
- 4+ years of experience in cloud-based solutions.
- 4+ years of solid hands-on Snowflake development.
- Hands-on experience designing and building data pipelines on cloud infrastructure, having worked extensively with AWS and Snowflake, including end-to-end builds covering ingestion, transformation, and extract generation in Snowflake.
- Strong hands-on experience writing complex SQL queries.
- Good understanding of and experience with Azure cloud services.
- Optimize and tune Snowflake performance, including query optimization; experience with scaling strategies.
- Address data issues, root-cause analysis, and production support.
- Experience working in the financial industry.
- Understanding of Agile methodologies.
- Snowflake and Azure certifications are an added advantage.
Posted 13 hours ago
6.0 - 10.0 years
15 - 30 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking an experienced AWS, Snowflake, and DBT professional to join our team in India. The ideal candidate will have a strong background in cloud technologies and data management, with a focus on delivering high-quality data solutions that drive business insights.

Responsibilities:
- Design, implement, and manage AWS cloud solutions to optimize performance and cost-efficiency
- Utilize Snowflake for data warehousing and analytics, ensuring data integrity and security
- Develop and maintain ETL pipelines using DBT to transform raw data into actionable insights
- Collaborate with cross-functional teams to gather requirements and deliver data solutions
- Monitor and troubleshoot cloud-based applications and databases to ensure reliability and performance
- Stay updated with the latest trends and advancements in cloud technologies and data management

Skills and Qualifications:
- 6-10 years of experience in cloud technologies, specifically AWS
- Strong proficiency in Snowflake, including data modeling, ETL processes, and performance tuning
- Experience with DBT for data transformation and analytics
- In-depth knowledge of SQL and database management
- Familiarity with data governance and data quality practices
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders
Posted 14 hours ago
6.0 - 9.0 years
15 - 30 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are looking for an experienced Snowflake on Azure professional to join our team in India. The ideal candidate will have a strong background in data warehousing solutions, particularly with Snowflake, and experience working in cloud environments, specifically Azure.

Responsibilities:
- Design and implement data models and ETL processes using Snowflake on Azure
- Manage and optimize Snowflake performance and costs
- Collaborate with data engineering and analytics teams to ensure data availability and integrity
- Develop and maintain documentation for data pipelines and architecture
- Troubleshoot and resolve data-related issues in a timely manner
- Stay updated with the latest trends and features in Snowflake and Azure

Skills and Qualifications:
- 6-9 years of experience in data warehousing solutions, specifically with Snowflake
- Strong knowledge of Azure cloud services and architecture
- Proficiency in SQL and data modeling techniques
- Experience with ETL tools and data integration methods
- Understanding of data governance and security best practices
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders
Posted 14 hours ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
- Creating and modifying Terraform files to accommodate evolving business needs
- Running CI/CD checks (TeamCity / Jenkins) and analyzing them upon failure
- Identifying and rectifying data flow issues within Snowflake
- Addressing customer requests raised by the Engineering team and by Data Analysts consuming the data
- Should be able to read and comprehend existing Terraform files
- Should be able to modify existing files based on requirements
- Should be able to create new Terraform files when a new service is introduced

Qualifications:
- 5+ years of experience with Terraform, including prior hands-on work with Terraform files
Posted 14 hours ago
3.0 - 7.0 years
3 - 7 Lacs
Pune, Maharashtra, India
On-site
We're searching for an experienced ADF/SSIS Developer to join our data engineering team. This role is ideal for a professional with a strong background in data warehousing and a proven track record in cloud computing, particularly within the Azure ecosystem. You'll be instrumental in migrating and developing robust ETL solutions, leveraging Azure services and various database technologies.

Key Responsibilities:
- Design, develop, and maintain ETL (Extract, Transform, Load) processes using Azure Data Factory (ADF) and SQL Server Integration Services (SSIS)
- Lead and participate in the migration of conventional ETL processes (SSIS) from on-premise environments to the Azure cloud landscape
- Work extensively with and manage data in cloud databases such as Azure SQL and Azure Synapse, as well as the Snowflake database
- Develop and optimize complex SQL queries and stored procedures, preferably within a SQL Server environment
- Collaborate with data architects and other developers to ensure data integrity, quality, and performance across all data pipelines
- Troubleshoot and resolve data-related issues, ensuring data accuracy and system reliability

Required Skills & Experience:
- Data engineering / data warehousing development and operations experience: 4+ years
- Cloud computing experience: minimum 2 years, with a focus on Azure
- ETL migration: reasonable experience migrating conventional ETL (SSIS) processes from on-premise to Azure
- Database experience: Azure SQL, Azure Synapse, Snowflake
- SQL development: proven experience, preferably with SQL Server
Posted 15 hours ago
6.0 - 8.0 years
2 - 4 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are looking for a Senior ETL/Data Engineer with 6-8 years of experience to join our Data Platform Operations team. The ideal candidate will have hands-on experience with data warehouse ETL tools and technologies such as Ab Initio, Snowflake, Oracle, and Control-M. The role involves supporting ETL platforms on AWS, ensuring scalability, high availability, and operational excellence.

Must-Have Skills:

ETL & Data Warehousing:
- Strong expertise in ETL architecture and development (preferably Ab Initio)
- Deep understanding of data warehouse concepts and architecture
- Proficiency in SQL (Oracle, MySQL, Snowflake, etc.)
- Experience with enterprise scheduling tools like Control-M
- Extensive operations support experience in ETL applications
- Willingness to work in UK shift timings

Programming & Development:
- Strong scripting skills (Unix, Python, etc.)
- Experience in system upgrades and application maintenance
- Familiarity with version control tools like Git

Project-Level Responsibilities:
- Collaborate with developers and architects to enforce best practices
- Solid understanding of database architectures and data models
- Ability to create and maintain technical documentation
- Experience working in Agile development environments
- Coordinate effectively with onshore teams

Good-to-Have Skills:
- AWS cloud knowledge, including AWS data services (Lambda, EMR, Athena, Glue, Redshift)
- Experience with dashboarding and metrics-collection tools like Grafana and Datadog
- Exposure to automating monitoring frameworks for critical business processes

Education Qualification: BE/B.Tech/MCA in Computer Science, Information Technology, or a related field.

If you're an ETL and data engineering expert looking for a challenging and rewarding role, apply now!
Posted 16 hours ago
6.0 - 10.0 years
2 - 4 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
We are seeking a Senior QA Engineer with 6-10 years of experience in ETL testing, data warehousing, and Snowflake to join our team. The ideal candidate will have a strong background in SQL debugging, data quality testing, and defect resolution within a capital markets environment.

Key Responsibilities:
- ETL & Data Warehouse Testing: Conduct comprehensive ETL and Snowflake testing, ensuring data quality, consistency, and integrity
- SQL Debugging & Optimization: Strong expertise in SQL query debugging and performance tuning to troubleshoot data issues
- Testing Automation & Tools: Work with Qlik Replicate, Compose (CDC tools), and Talend Cloud Data Integration for automated testing
- Defect & Test Management: Utilize JIRA, Xray, and other defect-tracking tools for issue management
- Financial Domain Expertise: Exposure to capital markets, Bloomberg, Reuters, MSCI, and other third-party financial data providers is a plus
- Collaboration Across Regions: Work across APAC, EMEA, and NA regions to identify root causes of issues and implement long-term solutions
- Stakeholder Communication: Effectively document, communicate, and escalate issues to ensure alignment across teams
- Experience with CRD (Charles River Development) is a strong advantage

Required Qualifications:
- 6-10 years of experience in ETL testing, data warehousing, and Snowflake
- Strong expertise in SQL query debugging and performance optimization
- Experience with Azure and Snowflake testing is a plus
- Hands-on experience with Talend Cloud Data Integration and Pentaho/Kettle is desirable
- Experience with JIRA, Xray, and defect management tools is preferred
- Strong problem-solving and analytical skills with a proactive approach to issue resolution
- Ability to work under tight deadlines and in high-pressure environments
- Self-motivated, team-oriented, and an excellent communicator

Why Join Us:
- Work on cutting-edge capital markets and financial data projects
- Be part of a global team, collaborating across APAC, EMEA, and NA
- Opportunity to innovate, optimize, and implement high-impact solutions

If you are a detail-oriented QA professional with a passion for data testing, problem-solving, and capital markets, apply now!
Posted 16 hours ago
6.0 - 10.0 years
2 - 4 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
We are looking for a Data Integration Developer / Senior Developer to join our team, responsible for the development and support of the Alpha Data Platform (ADP) Services. The ideal candidate will have expertise in ETL/ELT pipelines, Snowflake DWH, and data-readiness solutions, ensuring high-quality, efficient data integration for investment data management.

Key Responsibilities:
- Data Pipeline Development: Hands-on experience building and maintaining ETL/ELT pipelines, ensuring data quality and integrity
- Performance Optimization: Implement query performance tuning, exception handling, and data load optimizations to meet SLAs
- Solution Design & Innovation: Define data integration solutions that handle high-volume, multi-source data with complex relationships
- Collaboration & Documentation: Work across APAC, EMEA, and NA regions to establish design standards, documentation, and onboarding activities
- Governance & Compliance: Adhere to SDLC processes, including peer code reviews, unit testing, deployment, and security scanning
- Troubleshooting & Issue Resolution: Conduct root cause analysis and SQL query debugging, and resolve defects in collaboration with business and IT teams

Required Qualifications:
- B.Tech in Computer Science or a related technical field
- 6-10 years of experience in data integration, orchestration services, and service architecture
- Expertise in Microsoft Azure Cloud, Snowflake SQL, and database query/performance tuning
- Strong understanding of data warehousing concepts and experience with ETL tools like Talend Cloud Data Integration
- Proficiency in SQL debugging and performance tuning
- Experience with CI/CD deployment pipelines and cloud-managed services such as GitHub and Azure DevOps
- Financial domain knowledge is a plus
- Experience with Qlik Replicate and Compose (CDC tools) is an advantage
- Familiarity with third-party data providers such as Bloomberg, Reuters, and MSCI is a plus
- Prior experience with State Street and Charles River Development (CRD) is a plus

If you're a data-driven professional passionate about building scalable, high-performance data integration solutions, apply now!
Posted 16 hours ago
4.0 - 9.0 years
4 - 9 Lacs
Chennai, Tamil Nadu, India
On-site
Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake on Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work himself/herself
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (or an equivalent tool)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)
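The medallion layering this posting mentions (raw, refined, published) can be illustrated with a minimal sketch. This is purely an illustration in plain Python with invented field names, standing in for what would normally be Snowflake tables and SQL transformations:

```python
# Illustrative medallion-style flow (raw -> refined -> published), using plain
# Python dicts in place of Snowflake tables. All names are hypothetical.

def to_refined(raw_row):
    """Refined layer: enforce types and quarantine malformed rows."""
    try:
        return {"id": int(raw_row["id"]), "amount": float(raw_row["amount"])}
    except (KeyError, ValueError):
        return None  # drop rows that fail basic validation

def to_published(refined_rows):
    """Published layer: business-level aggregate over the refined data."""
    total = sum(r["amount"] for r in refined_rows)
    return {"row_count": len(refined_rows), "total_amount": total}

raw = [{"id": "1", "amount": "10.5"}, {"id": "x", "amount": "?"}, {"id": "2", "amount": "4.5"}]
refined = [r for r in (to_refined(row) for row in raw) if r is not None]
print(to_published(refined))  # {'row_count': 2, 'total_amount': 15.0}
```

The point of the layering is that each stage only trusts the stage before it: raw keeps everything as ingested, refined enforces types and quality, and published exposes business-ready results.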
Posted 16 hours ago
4.0 - 9.0 years
4 - 9 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary

Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake on Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work himself/herself
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (or an equivalent tool)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)
Posted 16 hours ago
2.0 - 5.0 years
2 - 5 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
- Must have experience working in Snowflake administration/development on data warehouse, ETL, and BI projects.
- Must have prior experience with end-to-end implementation of the Snowflake cloud data warehouse, as well as end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts, such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security, data access controls, and design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Provide resolution to an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Must have experience with Agile development methodologies.
- Good to have: CI/CD in Talend using Jenkins and Nexus; TAC configuration with LDAP, job servers, log servers, and databases; Job Conductor, scheduler, and monitoring; Git repository, creating users & roles and granting them access.
- Agile methodology and 24/7 admin and platform support.
- Estimation of effort based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
Posted 16 hours ago
10.0 - 14.0 years
10 - 14 Lacs
Delhi, India
On-site
Primary Skills:
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience with Workato
- Proven experience designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as a technical escalation point for complex integration issues
Posted 17 hours ago
10.0 - 14.0 years
10 - 14 Lacs
Pune, Maharashtra, India
On-site
Primary Skills:
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience with Workato
- Proven experience designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as a technical escalation point for complex integration issues
Posted 17 hours ago
10.0 - 14.0 years
10 - 14 Lacs
Chennai, Tamil Nadu, India
On-site
Primary Skills:
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience with Workato
- Proven experience designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as a technical escalation point for complex integration issues
Posted 17 hours ago
5.0 - 7.0 years
5 - 7 Lacs
Mumbai, Maharashtra, India
On-site
About the Role: We are looking for an experienced Snowflake Admin to manage and optimize Snowflake cloud data platforms. The ideal candidate should have strong expertise in Snowflake architecture, performance tuning, security, and administration. This role requires the ability to troubleshoot issues, automate processes, and collaborate with cross-functional teams.

Key Responsibilities:
- Administer and optimize Snowflake environments for performance and security
- Manage user roles, permissions, and access controls
- Implement best practices for database performance tuning and query optimization
- Monitor system performance and troubleshoot issues proactively
- Work with data engineering teams to support ETL processes and integrations
- Automate administrative tasks using SQL and scripting

Required Skills:
- 5+ years of experience in Snowflake administration
- Expertise in Snowflake architecture, data sharing, and workload optimization
- Strong knowledge of SQL and Python/shell scripting for automation
- Experience with data security, access management, and governance policies
- Understanding of cloud environments (AWS/Azure/GCP) and Snowflake integrations

Contract Duration: 3 Months (C2C)
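As a rough illustration of "automate administrative tasks using SQL and scripting", the sketch below only builds the grant statements a read-only reporting role typically needs. The role, warehouse, database, and schema names are invented for the example, and actually executing the statements would require a Snowflake session (e.g., via a connector), which is not shown:

```python
# Sketch: generating Snowflake role/grant statements for automation.
# All object names below are hypothetical.

def grant_statements(role, warehouse, database, schema):
    """Return the typical statements for provisioning a read-only role."""
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]

for stmt in grant_statements("REPORTING_RO", "ANALYTICS_WH", "ANALYTICS_DB", "PUBLIC"):
    print(stmt)
```

Generating grants from a function like this (rather than typing them ad hoc) keeps role provisioning repeatable and reviewable, which is the point of the automation this role asks for.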
Posted 17 hours ago
5.0 - 8.0 years
5 - 8 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
- Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development
- Strong working knowledge of advanced SQL capabilities such as analytics and windowing functions
- 3+ years of working knowledge of some RDBMS database is a must-have
- Exposure to shell scripts for invoking SQL calls
- Exposure to ETL tools would be good to have
- Working knowledge of Snowflake is good to have
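To make "windowing functions" concrete, the sketch below reproduces in plain Python what a running-total window such as `SUM(amount) OVER (PARTITION BY account ORDER BY txn_date)` computes; the column names are invented for the example:

```python
# Plain-Python equivalent of a SQL running-total window function:
# partition by account, order by date, accumulate the amount.
from itertools import groupby
from operator import itemgetter

def running_totals(rows):
    """rows: iterable of (account, txn_date, amount) tuples, in any order."""
    out = []
    ordered = sorted(rows, key=itemgetter(0, 1))  # PARTITION BY + ORDER BY
    for account, group in groupby(ordered, key=itemgetter(0)):
        total = 0.0
        for _, txn_date, amount in group:
            total += amount  # running sum within the partition
            out.append((account, txn_date, total))
    return out

rows = [("A", "2024-01-02", 5.0), ("A", "2024-01-01", 10.0), ("B", "2024-01-01", 2.0)]
print(running_totals(rows))
# [('A', '2024-01-01', 10.0), ('A', '2024-01-02', 15.0), ('B', '2024-01-01', 2.0)]
```

In SQL the database does the partitioning, ordering, and accumulation for you; the value of knowing window functions is replacing exactly this kind of hand-rolled loop.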
Posted 19 hours ago
3.0 - 5.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Req ID: 325282

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Snowflake and Data Vault 2 (optional) Consultant to join our team in Hyderabad, Telangana (IN-TG), India (IN).

- Extensive expertise in DBT, including macros, modeling, and automation techniques
- Proficiency in SQL, Python, or other scripting languages for automation
- Experience leveraging Snowflake for scalable data solutions
- Familiarity with Data Vault 2.0 methodologies is an advantage
- Strong capability in optimizing database performance and managing large datasets
- Excellent problem-solving and analytical skills
- Minimum of 3+ years of relevant experience, with a total of 5+ years of overall experience

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at .
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
Posted 1 day ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in computer science engineering or information technology, or a BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Cloud Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low latency/streaming, batch

Preferred Qualifications / Skills:
- Data platforms: DBT, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 day ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in computer science engineering or information technology, or a BSc in Computer Science, Mathematics, or a similar field; a Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Cloud Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low latency/streaming, batch

Preferred Qualifications / Skills:
- Data platforms: DBT, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 day ago
0.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Inviting applications for the role of Lead Consultant -Data Engineer! . Design, document & implement the data pipelines to feed data models for subsequent consumption in Snowflake using dbt, and airflow. . Ensure correctness and completeness of the data being transformed via engineering pipelines for end consumption in Analytical Dashboards. . Actively monitor and triage technical challenges in critical situations that require immediate resolution. . Evaluate viable technical solutions and share MVPs or PoCs in support of the research . Develop relationships with external stakeholders to maintain awareness of data and security issues and trends . Review work from other tech team members and provide feedback for growth . Implement Data Performance and data security policies that align with governance objectives and regulatory requirements . Effectively mentor and develop your team members . You have experience in data warehousing, data modeling, and the building of data engineering pipelines. . You are well versed in data engineering methods, such as ETL and ELT techniques through scripting and/or tooling. . You are good at analyzing performance bottlenecks and providing enhancement recommendations you have a passion for customer service and a desire to learn and grow as a professional and a technologist. . Strong analytical skills related to working with structured, semi-structured, and unstructured datasets. . Collaborating with product owners to identify requirements, define desired and deliver trusted results. . Building processes supporting data transformation, data structures, metadata, dependency, and workload management. . In this role, SQL is heavily focused. An ideal candidate must have hands-on experience with SQL database design. Plus, Python. . Demonstrably deep understanding of SQL (level: advanced) and analytical data warehouses (Snowflake preferred). . 
Demonstrated ability to write new code that is well-documented and stored in a version control system (we use GitHub & Bitbucket).
Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
Familiar with JIRA & Confluence.
Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
Desire to continually keep up with advancements in data engineering practices.
Qualifications we seek in you!
Minimum qualifications:
Essential Education
Bachelor's degree or equivalent combination of education and experience. Bachelor's degree in information science, data management, computer science or a related field preferred.
Essential Experience & Job Requirements
IT experience with a major focus on data warehouse/database-related projects
Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
Experience in other data platforms: Oracle, SQL Server, MDM, etc.
Expertise in writing SQL and database objects - stored procedures, functions, and views. Hands-on experience in ETL/ELT and data security, SQL performance optimization, and job orchestration tools and technologies, e.g., dbt, APIs, Apache Airflow, etc.
Experience in data modeling and relational database design
Well-versed in applying SCD, CDC, and DQ/DV frameworks.
Demonstrated ability to write new code that is well-documented and stored in a version control system (we use GitHub & Bitbucket)
Good to have experience with cloud platforms such as AWS, Azure, GCP and Snowflake
Good to have strong programming/scripting skills (Python, PowerShell, etc.)
Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs)
Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations to global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams across levels
Global Data Product Portfolio Management & teams (Enterprise Data Model, Data Catalog, Master Data Management)
Preferred Qualifications
Knowledge of AWS cloud and Python is a plus.
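The SCD framework the posting asks for can be illustrated with a minimal sketch. This is a hedged, illustrative Type 2 slowly-changing-dimension merge in plain Python; the `scd2_upsert` helper, its record layout, and the date format are all hypothetical, not part of the role description:

```python
def scd2_upsert(history, incoming, today):
    """Minimal Type 2 SCD merge (illustrative only).

    history:  list of dicts with keys id, value, valid_from, valid_to
              (valid_to is None for the current version of a record)
    incoming: dict mapping id -> latest value snapshot
    today:    ISO date string used to open/close versions
    Returns the updated history list.
    """
    current = {r["id"]: r for r in history if r["valid_to"] is None}
    for key, value in incoming.items():
        row = current.get(key)
        if row is None:
            # brand-new key: open a fresh current record
            history.append({"id": key, "value": value,
                            "valid_from": today, "valid_to": None})
        elif row["value"] != value:
            # changed value: close the old version, open a new one
            row["valid_to"] = today
            history.append({"id": key, "value": value,
                            "valid_from": today, "valid_to": None})
        # unchanged rows are left untouched
    return history
```

In a warehouse this same close-and-reopen logic would typically run as a `MERGE` statement or a dbt snapshot rather than application code.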
Posted 1 day ago
1.0 - 3.0 years
1 - 3 Lacs
Chennai, Tamil Nadu, India
On-site
AWS, Python, SQL, Spark, Airflow, Snowflake
Responsibilities
Create and manage cloud resources in AWS
Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems
Implement data ingestion and processing with the help of big data technologies
Process/transform data using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations
Develop an infrastructure to collect, transform, combine, and publish/distribute customer data
Define process improvement opportunities to optimize data collection, insights, and displays
Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible
Identify and interpret trends and patterns from complex data sets
Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders
Key participant in regular Scrum ceremonies with the agile teams
Proficient at developing queries, writing reports, and presenting findings
Mentor junior members and bring best industry practices
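The automated data quality checks mentioned above can be sketched as a small rule engine that quarantines bad rows before they enter the platform. This is a hedged sketch; `run_quality_checks`, the rule names, and the row shape are illustrative assumptions, not part of the posting:

```python
def run_quality_checks(rows, rules):
    """Apply named predicate rules to each row (illustrative sketch).

    rules: dict of rule name -> callable(row) -> bool.
    Returns (good_rows, failures); each failure records which rules broke,
    so bad rows can be quarantined and triaged rather than silently loaded.
    """
    good, failures = [], []
    for row in rows:
        broken = [name for name, check in rules.items() if not check(row)]
        if broken:
            failures.append({"row": row, "failed": broken})
        else:
            good.append(row)
    return good, failures
```

In practice the same pattern is usually expressed through a dedicated framework (e.g. dbt tests or Great Expectations) rather than hand-rolled predicates.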
Posted 1 day ago
5.0 - 8.0 years
2 - 11 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Data Warehouse Solution Design & Development: Lead the design and implementation of batch and real-time ingestion architectures for data warehouses. Ensure that solutions are scalable, reliable, and optimized for performance.
Team Leadership & Mentoring: Lead and mentor a team of data engineers, fostering a collaborative environment to encourage knowledge sharing and continuous improvement. Ensure that the team meets high standards of quality and performance.
Hands-on Technical Delivery: Actively engage in hands-on development and ensure seamless delivery of data solutions. Provide technical direction and hands-on support for complex issues.
Issue Resolution & Troubleshooting: Troubleshoot issues that arise during runtime and provide quick resolutions to minimize disruptions and maintain system stability.
API Management: Oversee the integration and management of APIs using APIM for seamless communication between internal and external systems. Implement and maintain API gateways and monitor API performance.
Client Communication: Interact directly with clients, ensuring clear and convincing communication of technical ideas and project progress. Translate customer requirements into technical solutions and drive the implementation process.
Cloud & DevOps: Ensure that data solutions are designed with cloud-native technologies such as Azure, Snowflake, and DBT. Use Azure DevOps for continuous integration and deployment pipelines.
Mentoring & Best Practices: Guide the team on best practices for data engineering, code reviews, and performance optimization. Ensure the adoption of modern tools and techniques to improve delivery efficiency.
Mandatory Skills:
Python for data engineering
Snowflake and Postgres development experience
Proficient in API Management (APIM) and DBT
Strong experience with Azure DevOps for CI/CD
Proven experience in data warehouse solutions design, development, and implementation
Desired Skills:
Experience with Apache Kafka, Azure Event Hub, Apache Airflow, Apache Flink
Familiarity with Grafana, Prometheus, Terraform, Kubernetes
Power BI for reporting and data visualization
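The batch-ingestion side of this role usually reduces to a watermark-driven incremental load: fetch only rows changed since the last run, load them, and advance the high-water mark. The sketch below is a hedged illustration; all four callables (`fetch_since`, `load`, `get_watermark`, `set_watermark`) are assumed interfaces, not APIs from the posting:

```python
def incremental_batch(fetch_since, load, get_watermark, set_watermark):
    """One cycle of watermark-driven incremental ingestion (sketch).

    fetch_since(ts) -> records, each carrying an 'updated_at' value;
    load(records) writes them to the target (e.g. Snowflake or Postgres);
    get_watermark/set_watermark persist the high-water mark between runs.
    """
    watermark = get_watermark()
    records = fetch_since(watermark)
    if not records:
        return watermark  # nothing new; watermark unchanged
    load(records)
    new_mark = max(r["updated_at"] for r in records)
    set_watermark(new_mark)  # advance only after a successful load
    return new_mark
```

Advancing the watermark only after the load succeeds is what makes re-runs safe: a failed cycle simply re-fetches the same window next time.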
Posted 1 day ago
2.0 - 6.0 years
2 - 7 Lacs
Bengaluru / Bangalore, Karnataka, India
Remote
We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.
Responsibilities:
Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
Develop and optimize database queries, stored procedures, and functions
Collaborate with cross-functional teams to identify and prioritize database requirements
Implement data modeling, database design, and data warehousing best practices
Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
Troubleshoot and resolve database performance issues and errors
Stay up-to-date with the latest database technologies and trends
Requirements:
8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
Strong understanding of database principles, including data modeling, database design, and data warehousing
Experience with database performance tuning, optimization, and troubleshooting
Proficiency in T-SQL, SQL, and database query languages
Experience with agile development methodologies and version control systems (e.g., Git)
Strong communication and collaboration skills
Bachelor's degree in Computer Science, Information Technology, or a related field
Nice to Have:
Experience with cloud-based databases (e.g., AWS, Azure)
Knowledge of data governance, data quality, and data security best practices
Experience with data visualization tools (e.g., Tableau, Power BI)
Certifications in MS SQL Server, NoSQL, or Snowflake
What We Offer:
Competitive salary and benefits package
Opportunities for career growth and professional development
Collaborative and dynamic work environment
Flexible work arrangements, including remote work options
Access to the latest database technologies and tools
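The query-optimization and stored-procedure skills above come down to two habits: always parameterize, and index the columns your filters hit. As a hedged illustration (using Python's built-in SQLite as a stand-in for SQL Server; the `orders` table, index name, and data are all invented for the example):

```python
import sqlite3

# In-memory SQLite stands in for SQL Server; table and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])

# A composite index on (customer, amount) can serve the aggregate below
# from the index alone, without touching the base table.
conn.execute("CREATE INDEX idx_orders_customer_amount ON orders (customer, amount)")

def customer_total(conn, customer):
    """Parameterized aggregate query; the ? placeholder avoids SQL injection."""
    row = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE customer = ?",
        (customer,)).fetchone()
    return row[0]
```

On SQL Server the same shape would live in a stored procedure with a typed parameter, and the execution plan viewer would confirm whether the index is used.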
Posted 1 day ago
2.0 - 6.0 years
2 - 7 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
Remote
We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.
Responsibilities:
Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
Develop and optimize database queries, stored procedures, and functions
Collaborate with cross-functional teams to identify and prioritize database requirements
Implement data modeling, database design, and data warehousing best practices
Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
Troubleshoot and resolve database performance issues and errors
Stay up-to-date with the latest database technologies and trends
Requirements:
8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
Strong understanding of database principles, including data modeling, database design, and data warehousing
Experience with database performance tuning, optimization, and troubleshooting
Proficiency in T-SQL, SQL, and database query languages
Experience with agile development methodologies and version control systems (e.g., Git)
Strong communication and collaboration skills
Bachelor's degree in Computer Science, Information Technology, or a related field
Nice to Have:
Experience with cloud-based databases (e.g., AWS, Azure)
Knowledge of data governance, data quality, and data security best practices
Experience with data visualization tools (e.g., Tableau, Power BI)
Certifications in MS SQL Server, NoSQL, or Snowflake
What We Offer:
Competitive salary and benefits package
Opportunities for career growth and professional development
Collaborative and dynamic work environment
Flexible work arrangements, including remote work options
Access to the latest database technologies and tools
Posted 1 day ago
2.0 - 6.0 years
2 - 7 Lacs
Delhi, India
Remote
We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.
Responsibilities:
Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
Develop and optimize database queries, stored procedures, and functions
Collaborate with cross-functional teams to identify and prioritize database requirements
Implement data modeling, database design, and data warehousing best practices
Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
Troubleshoot and resolve database performance issues and errors
Stay up-to-date with the latest database technologies and trends
Requirements:
8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
Strong understanding of database principles, including data modeling, database design, and data warehousing
Experience with database performance tuning, optimization, and troubleshooting
Proficiency in T-SQL, SQL, and database query languages
Experience with agile development methodologies and version control systems (e.g., Git)
Strong communication and collaboration skills
Bachelor's degree in Computer Science, Information Technology, or a related field
Nice to Have:
Experience with cloud-based databases (e.g., AWS, Azure)
Knowledge of data governance, data quality, and data security best practices
Experience with data visualization tools (e.g., Tableau, Power BI)
Certifications in MS SQL Server, NoSQL, or Snowflake
What We Offer:
Competitive salary and benefits package
Opportunities for career growth and professional development
Collaborative and dynamic work environment
Flexible work arrangements, including remote work options
Access to the latest database technologies and tools
Posted 1 day ago